32 results for MULTICOMMUTED FLOW ANALYSIS

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
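
As an illustration of the bit-level modelling described above, the following sketch (not the paper's actual model library) shows how a bitwise AND with a constant mask can be treated as a partial downgrader: only the bits kept by the mask remain possible flow paths for classified data.

```python
# Minimal sketch (not the paper's expression library): bit-level taint propagation
# through a C-style "x & MASK" expression. Each bit of x carries a taint flag;
# bits cleared by the mask cannot leak classified data.

def taint_after_and_const(taint_bits: int, mask: int) -> int:
    """Taint bitmap surviving 'x & mask' (1 = classified bit may still flow)."""
    return taint_bits & mask

# x is fully classified (all 8 bits tainted); the expression 'x & 0x0F'
# blocks the upper nibble, so only the low 4 bits remain a flow path.
x_taint = 0xFF
print(bin(taint_after_and_const(x_taint, 0x0F)))  # 0b1111 -> partial downgrade
print(bin(taint_after_and_const(x_taint, 0x00)))  # 0b0 -> expression blocks all flow
```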

Relevance:

90.00%

Publisher:

Abstract:

System analysis within the traction power system is vital to the design and operation of an electrified railway. Loads in traction power systems are often characterised by their mobility, wide range of power variations, regeneration and service dependence. In addition, the feeding systems may take different forms in AC electrified railways. Comprehensive system studies are usually carried out by computer simulation. A number of traction power simulators are available; they allow calculation of the electrical interaction among trains and provide deterministic solutions of the power network. In this paper, a different approach is presented that enables load-flow analysis of various feeding systems and service demands in AC railways by adopting probabilistic techniques, with the aim of providing a different viewpoint on the load conditions. Simulation results are given to verify the probabilistic load-flow models.
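
As a hedged illustration of the probabilistic approach (the purely resistive model, feeding arrangement and figures below are assumptions, not taken from the paper), a Monte Carlo load-flow sketch for a single-end-fed section might sample train position and demand and report the resulting voltage statistics.

```python
# Monte Carlo probabilistic load flow for one single-end-fed AC feeding section.
# Numbers and the resistive (reactance- and regeneration-free) model are illustrative.
import random, math, statistics

V_S = 25_000.0      # substation busbar voltage (V), assumed
R_PER_KM = 0.20     # feeder resistance (ohm/km), assumed
SECTION_KM = 30.0   # section length, assumed

def pantograph_voltage(d_km: float, p_watts: float) -> float:
    """Solve V = V_S - (P/V) * R for the train voltage (resistive model)."""
    r = R_PER_KM * d_km
    disc = V_S**2 - 4.0 * p_watts * r
    return (V_S + math.sqrt(disc)) / 2.0

samples = []
for _ in range(10_000):
    d = random.uniform(0.0, SECTION_KM)          # random train position
    p = random.triangular(0.5e6, 6.0e6, 3.0e6)   # demand 0.5-6 MW, mode 3 MW, assumed
    samples.append(pantograph_voltage(d, p))

print(f"mean {statistics.mean(samples):.0f} V, "
      f"5th percentile {sorted(samples)[len(samples)//20]:.0f} V")
```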

Relevance:

90.00%

Publisher:

Abstract:

This paper presents the development of a simulation model of passenger flow in a metro station. The model allows studies of passenger flow in stations with different layouts and facilities, providing operators with valuable information such as passenger flow rates and passenger densities at critical locations and passenger-handling facilities within a station. The adoption of the concept of Petri nets in the simulation model is discussed. Examples are provided to demonstrate its application to passenger flow analysis, train scheduling and the testing of alternative station layouts.
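
A minimal sketch of the Petri-net idea is given below (the places, transitions and throughput figures are illustrative assumptions, not the paper's station model): passengers are tokens, station areas are places, and facilities such as gates and escalators are transitions with per-step capacities, so congestion appears as tokens accumulating at a bottleneck.

```python
# Illustrative Petri-net-style passenger flow: tokens move between places when
# transitions fire, each limited by an assumed per-time-step capacity.

marking = {"concourse": 120, "gateline": 0, "platform": 0}

# transition: (input place, output place, max firings per time step = capacity)
transitions = [
    ("concourse", "gateline", 25),   # ticket gates, assumed throughput per step
    ("gateline", "platform", 15),    # escalator, assumed throughput per step
]

def step(marking):
    """Fire each enabled transition up to its per-step capacity."""
    for src, dst, cap in transitions:
        moved = min(marking[src], cap)
        marking[src] -= moved
        marking[dst] += moved
    return marking

for t in range(6):
    step(marking)
    print(t + 1, marking)   # passenger density builds at the gateline bottleneck
```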

Relevance:

90.00%

Publisher:

Abstract:

Probabilistic load flow techniques have been adopted in AC electrified railways to study the load demand under various train service conditions. This paper highlights the differences in probabilistic load flow analysis between conventional power systems and the power supply systems of AC railways, discusses the possible difficulties in problem formulation, and presents the link between train movement and the corresponding power demand for load flow calculation.
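
The movement-to-demand link can be sketched as follows, under a simplified and assumed model (mass, Davis resistance coefficients and efficiency are illustrative, and regeneration is ignored): each speed/acceleration sample from a train movement calculation is converted into an electrical demand that a probabilistic load flow can use.

```python
# Hedged sketch of converting train movement into electrical power demand.
# All coefficients are illustrative assumptions, not values from the paper.

MASS_T = 400.0                 # train mass (tonnes), assumed
ETA = 0.85                     # drive-chain efficiency, assumed
A, B, C = 3.0, 0.03, 0.0008    # Davis running-resistance coefficients, assumed

def electrical_demand_kw(v_kmh: float, accel_ms2: float) -> float:
    """Power at the pantograph for one simulation step (regeneration ignored)."""
    v_ms = v_kmh / 3.6
    resistance_kn = A + B * v_kmh + C * v_kmh**2
    tractive_kn = MASS_T * accel_ms2 + resistance_kn   # F = m*a + running resistance (kN)
    return max(tractive_kn * v_ms, 0.0) / ETA          # kN * m/s = kW

# accelerating at 0.5 m/s^2 through 40 km/h vs cruising at 120 km/h
print(electrical_demand_kw(40.0, 0.5), electrical_demand_kw(120.0, 0.0))
```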

Relevance:

90.00%

Publisher:

Abstract:

Power load flow analysis is essential for system planning, operation, development and maintenance, and its application to railway supply systems is no exception. A railway power supply system distinguishes itself in terms of load pattern and mobility, as well as feeding system structure. An attempt has been made to apply probabilistic load flow (PLF) techniques to electrified railways in order to examine the loading on the feeding substations and the voltage profiles of the trains. This study formulates a simple and reliable model to support the calculations required for probabilistic load flow analysis in railway systems with an autotransformer (AT) feeding system, and describes the development of a software suite to realise the computation.
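
As a hedged illustration of what such a computation might look like (the traffic and demand distributions below are assumptions, and the AT feeding arrangement itself is not modelled), the loading on one feeder substation can be estimated by sampling the number of trains in its section and their individual demands.

```python
# Illustrative sketch, not the paper's software suite: estimate the loading
# distribution of one feeder substation by Monte Carlo sampling.
import random, statistics

def substation_load_mva(mean_trains: float = 4.0) -> float:
    n_trains = max(0, int(random.gauss(mean_trains, 1.5)))          # trains in section, assumed
    return sum(random.uniform(1.0, 8.0) for _ in range(n_trains))   # MVA per train, assumed

loads = sorted(substation_load_mva() for _ in range(20_000))
print(f"mean load {statistics.mean(loads):.1f} MVA, "
      f"95th percentile {loads[int(0.95 * len(loads))]:.1f} MVA")
```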

Relevance:

90.00%

Publisher:

Abstract:

This article integrates material/energy flow analysis into a production frontier framework to quantify resource efficiency (RE). The emergy content of natural resources, rather than their mass content, is used to construct aggregate inputs. Using the production frontier approach, aggregate inputs are optimised relative to given output quantities to derive RE measures. This framework is superior to existing RE indicators currently used in the literature: using the exergy/emergy content in constructing aggregate material or energy flows overcomes the criticism that mass content cannot capture the differing quality of different types of resources, and the derived RE measures are both ‘qualitative’ and ‘quantitative’, whereas existing RE indicators are only qualitative. An empirical examination of the RE of 116 economies was undertaken to illustrate the practical applicability of the new framework. The results showed that economies, on average, could reduce their consumption of resources by more than 30% without any reduction in per capita gross domestic product (GDP); this calculation was made after adjusting for differences in the purchasing power of national currencies. RE varied widely across economies and was positively correlated with labour force participation, population density, urbanisation, and GDP growth over the past five years. The results also showed that economies in higher income groups achieved higher RE, and that economies more dependent on imports and primary industries had lower RE performance.
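
A minimal sketch of the production frontier calculation is shown below, using an input-oriented DEA-style linear program with made-up data for one emergy-aggregated input and one output (GDP); the paper's actual model specification and dataset are not reproduced here.

```python
# Input-oriented efficiency against a production frontier: how far could an
# economy's aggregate (emergy-based) input be scaled down at its current output?
import numpy as np
from scipy.optimize import linprog

inputs = np.array([5.0, 8.0, 6.0, 10.0])    # emergy-aggregated input, illustrative
outputs = np.array([4.0, 7.0, 3.0, 6.0])    # GDP, illustrative

def resource_efficiency(k: int) -> float:
    """Input-oriented CCR efficiency of economy k (1.0 = on the frontier)."""
    n = len(inputs)
    c = np.r_[1.0, np.zeros(n)]                     # minimise theta
    a_in = np.r_[-inputs[k], inputs]                # sum_j lam_j*x_j <= theta*x_k
    a_out = np.r_[0.0, -outputs]                    # sum_j lam_j*y_j >= y_k
    res = linprog(c, A_ub=np.vstack([a_in, a_out]), b_ub=[0.0, -outputs[k]],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return res.x[0]

for k in range(4):
    print(f"economy {k}: RE = {resource_efficiency(k):.2f}")
```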

Relevance:

80.00%

Publisher:

Abstract:

Experts in injection molding often refer to previous solutions to find a mold design similar to the current mold, and use previous successful molding process parameters, with intuitive adjustment and modification, as a starting point for the new molding application. This approach saves a substantial amount of the time and cost of the experiment-based corrective actions that are otherwise required to reach optimum molding conditions. A Case-Based Reasoning (CBR) system can perform the same task by retrieving a similar case from the case library, applying it to the new case, and using modification rules to adapt a solution to the new case. A CBR system can therefore simulate human expertise in injection molding process design. This research is aimed at developing an interactive Hybrid Expert System to reduce the expert dependency needed on the production floor. The Hybrid Expert System (HES) comprises CBR, flow analysis, post-processor and troubleshooting subsystems. The HES can provide a first set of operating parameters that achieve moldability and produce moldings free of stress cracks and warpage. In this work the C++ programming language is used to implement the expert system. The Case-Based Reasoning subsystem is constructed to derive the optimum magnitudes of the process parameters in the cavity. Toward this end, the Flow Analysis subsystem is employed to calculate the pressure drop and temperature difference in the feed system and so determine the required magnitudes of the parameters at the nozzle. The Post-Processor converts the molding parameters to machine setting parameters. The parameters designed by the HES are implemented on the injection molding machine. In the presence of any molding defect, the troubleshooting subsystem can determine which combination of process parameters must be changed during the process to deal with possible variations. The constraints on the application of this HES are as follows: flow length (L), 40 mm < L < 100 mm; flow thickness (Th), 1 mm < Th < 4 mm; flow type, unidirectional flow; material types, High Impact Polystyrene (HIPS) and Acrylic. In order to test the HES, experiments were conducted and satisfactory results were obtained.
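
The retrieval step of the CBR subsystem can be sketched as a weighted nearest-neighbour search over stored cases (the case attributes, weights and parameter values below are illustrative assumptions, not the thesis's actual case representation).

```python
# Illustrative CBR retrieval: find the stored molding case closest to the new
# part and reuse its process parameters as a starting point for adaptation.

case_library = [
    {"length": 60, "thickness": 2.0, "material": "HIPS",
     "melt_temp_c": 220, "inj_pressure_mpa": 80},     # assumed case data
    {"length": 90, "thickness": 3.5, "material": "Acrylic",
     "melt_temp_c": 245, "inj_pressure_mpa": 95},     # assumed case data
]

def similarity(case, query):
    """Weighted nearest-neighbour similarity (higher is more similar)."""
    score = -abs(case["length"] - query["length"]) / 60.0        # length range ~40-100 mm
    score -= abs(case["thickness"] - query["thickness"]) / 3.0   # thickness range ~1-4 mm
    score -= 0.0 if case["material"] == query["material"] else 1.0
    return score

new_part = {"length": 70, "thickness": 2.5, "material": "HIPS"}
best = max(case_library, key=lambda c: similarity(c, new_part))
print("retrieved case:", best)   # starting point; adaptation rules would then adjust it
```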

Relevance:

80.00%

Publisher:

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
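
The abstract does not define the individual metrics, so the following is only a hypothetical illustration of the general shape of a design-level security metric: the share of classified attributes that a design leaves non-private, where a lower value suggests better encapsulation of high-security data.

```python
# Hypothetical design-level metric sketch (not the thesis's actual definitions):
# the proportion of classified attributes a design exposes outside encapsulation.
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    classified: bool   # carries high-security data
    private: bool      # encapsulated behind accessors

def classified_exposure(attrs: list[Attribute]) -> float:
    classified = [a for a in attrs if a.classified]
    if not classified:
        return 0.0
    return sum(not a.private for a in classified) / len(classified)

design = [Attribute("pin", True, False),        # classified and exposed
          Attribute("balance", True, True),     # classified but encapsulated
          Attribute("logo_path", False, False)] # not classified
print(f"classified attribute exposure: {classified_exposure(design):.2f}")  # 0.50
```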

Relevance:

80.00%

Publisher:

Abstract:

Bagasse stockpile operations have the potential to lead to adverse environmental and social impacts. Dust releases can cause occupational health and safety concerns for factory workers, and dust emissions impact on the surrounding community. Preliminary modelling showed that bagasse depithing would likely reduce the environmental risks, particularly dust emissions, associated with large scale bagasse stockpiling operations. Dust emission properties were measured and used for dispersion modelling with favourable outcomes. Modelling showed a 70% reduction in peak ground level concentrations of PM10 dust (particles with an aerodynamic diameter less than 10 µm) from operations on depithed bagasse stockpiles compared to similar operations on stockpiles of whole bagasse. However, the costs of a depithing operation at a sugar factory were estimated at approximately $2.1 million in capital expenditure to process 100,000 t/y of bagasse, with operating costs of approximately $200,000 p.a. The total capital cost for a 10,000 t/y operation was approximately $1.6 million. Based on a discounted cash flow analysis, the cost of depithing was $5.50 per tonne of bagasse for the 100,000 t/y scenario. This may make depithing prohibitively expensive in many situations if installed exclusively as a dust control measure.
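
The per-tonne figure can be approximated with a simple levelised-cost calculation, as sketched below; the discount rate and project life are assumptions (they are not stated in the abstract), so the result only approximates the quoted $5.50/t.

```python
# Hedged discounted-cash-flow sketch of the depithing cost per tonne.

CAPEX = 2_100_000.0        # capital cost, 100,000 t/y scenario (from abstract)
OPEX = 200_000.0           # operating cost per year (from abstract)
TONNES = 100_000.0         # bagasse depithed per year (from abstract)
RATE, YEARS = 0.10, 10     # discount rate and project life: assumed

npv_costs = CAPEX + sum(OPEX / (1 + RATE) ** t for t in range(1, YEARS + 1))
npv_tonnes = sum(TONNES / (1 + RATE) ** t for t in range(1, YEARS + 1))
print(f"levelised depithing cost: ${npv_costs / npv_tonnes:.2f} per tonne")
```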

Relevance:

80.00%

Publisher:

Abstract:

Many Australian families are unable to access homeownership because house prices have risen to severely or seriously unaffordable levels, so many low-income families will need to rely on the supply of affordable rental housing. The Australian Government introduced the National Rental Affordability Scheme (NRAS) in July 2008. The scheme aims to increase the supply of affordable rental housing by 50,000 dwellings across Australia by June 2014. It provides a financial incentive for investors to purchase new affordable housing that must be rented at a minimum of 20% below the market rent. The scheme has been in place for four years to June 2012, and there is debate about its success or failure; one argument is that the scheme has been more successful in Queensland but has failed to meet its aims in NSW. This paper examines the NRAS incentive designed to encourage affordable housing supply in Australia and demonstrates, using a discounted cash flow analysis of a hypothetical case study, why development is crowded into areas of NSW where land prices are relatively low. The findings suggest that high land values and the increasing cost of development were the main constraints on implementing the scheme in NSW, and that a flat-rate subsidy is inadequate to ensure the viability of affordable housing projects in high-cost areas.
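
The argument can be illustrated with a hedged discounted cash flow sketch (all figures below are invented and the NRAS incentive is simplified to a flat annual amount): because the subsidy does not scale with land cost, a per-dwelling appraisal favours low land-cost sites.

```python
# Hedged per-dwelling appraisal: fixed 20%-below-market rent plus a flat incentive,
# compared across a cheap-land and an expensive-land site. All numbers assumed.

RATE, YEARS = 0.07, 10          # discount rate and holding period: assumed
SUBSIDY = 10_000.0              # flat annual incentive per dwelling: assumed
CONSTRUCTION = 250_000.0        # construction cost per dwelling: assumed
GROWTH = 0.03                   # capital growth used for the resale value: assumed

def npv_per_dwelling(land_cost: float, market_rent_pa: float) -> float:
    cost = land_cost + CONSTRUCTION
    income_pa = 0.8 * market_rent_pa + SUBSIDY        # rent fixed 20% below market
    annuity = sum(1 / (1 + RATE) ** t for t in range(1, YEARS + 1))
    resale_pv = cost * (1 + GROWTH) ** YEARS / (1 + RATE) ** YEARS
    return income_pa * annuity + resale_pv - cost

print("outer-suburban site:", round(npv_per_dwelling(100_000, 22_000)))   # cheap land
print("inner-city site:    ", round(npv_per_dwelling(450_000, 30_000)))   # expensive land
```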

Relevance:

80.00%

Publisher:

Abstract:

This research investigated the effectiveness of using an eco-driving strategy at urban signalised intersections from both the individual driver and the traffic flow perspective. The project included a field driving experiment and a series of traffic simulation investigations. The study found that the prevailing eco-driving strategy has negative impacts on traffic mobility and environmental performance when the traffic is highly congested. An improved eco-driving strategy has been developed to mitigate these negative impacts.

Relevance:

80.00%

Publisher:

Abstract:

Voltage unbalance is a major power quality problem in low voltage residential feeders, due to the random location and rating of single-phase rooftop photovoltaic (PV) systems. In this paper, two improvement methods, based on the application of series (DVR) and parallel (DSTATCOM) custom power devices, are investigated for mitigating voltage unbalance in these feeders. First, based on a load flow analysis carried out in MATLAB, the effectiveness of the two custom power devices in reducing voltage unbalance is studied for urban and semi-urban/rural feeders containing rooftop PV, from the points of view of installation location and rating. A Monte Carlo based stochastic analysis is then carried out to investigate their efficacy under uncertainty in load and in PV rating and location in the network. After the numerical analyses, a converter topology and control algorithm are proposed for the DSTATCOM and DVR to balance the network voltage at their point of common coupling. A state feedback control, based on a pole-shift technique, is developed to regulate the voltage at the output of the DSTATCOM and DVR converters such that voltage balancing is achieved in the network. The dynamic feasibility of voltage unbalance mitigation and voltage profile improvement in LV feeders, using the proposed structure and control algorithm for the DSTATCOM and DVR, is verified through detailed PSCAD/EMTDC simulations.
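
The stochastic part of the study can be sketched, under a heavily simplified single-bus, purely resistive model with invented figures, as a Monte Carlo evaluation of the voltage unbalance factor (VUF) over random rooftop PV allocations.

```python
# Hedged Monte Carlo sketch: random single-phase PV sizes change each phase's
# voltage drop, and the VUF (|V2|/|V1|) is evaluated over many scenarios.
import cmath, random, statistics

A = cmath.exp(2j * cmath.pi / 3)     # 120-degree rotation operator
V_NOM, Z_PHASE = 230.0, 0.5          # nominal phase voltage (V) and feeder resistance (ohm), assumed

def vuf_percent(p_load=(8e3, 8e3, 8e3)):
    """VUF for one random PV scenario (unity power factor, resistive feeder)."""
    v = []
    for phase in range(3):
        pv = random.uniform(0.0, 5e3) * random.randint(0, 3)   # 0-3 systems of up to 5 kW, assumed
        i = (p_load[phase] - pv) / V_NOM                        # net phase current
        v.append(V_NOM - i * Z_PHASE)
    va, vb, vc = v[0], v[1] * A**2, v[2] * A      # re-attach 120-degree phase angles
    v1 = (va + A * vb + A**2 * vc) / 3            # positive-sequence component
    v2 = (va + A**2 * vb + A * vc) / 3            # negative-sequence component
    return 100.0 * abs(v2) / abs(v1)

runs = [vuf_percent() for _ in range(5_000)]
print(f"mean VUF {statistics.mean(runs):.2f}%  max VUF {max(runs):.2f}%")
```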